Markovian process

See also in other dictionaries:

  • Markovian process — Markovo vyksmas (Lithuanian); status: physics term; equivalents: Engl. Markov process, Markovian process; Ger. Markow-Prozeß m, Markowscher Prozeß m; Rus. марковский процесс m, процесс Маркова m; Fr. processus de Markoff m, processus marcovien m; … …   Fizikos terminų žodynas

  • Markovian arrival processes — In queueing theory, Markovian arrival processes are used to model the arrival of customers to a queue. Some of the most common include the Poisson process, the Markov arrival process and the batch Markov arrival process. … …   Wikipedia

  • Markovian — adjective; relating to or generated by a Markov process. Pertains to noun: Markov process. Derivationally related forms: Markov. … …   Useful english dictionary

  • Markovian — or Markov; also Markoff adjective Date: 1944 of, relating to, or resembling a Markov process or Markov chain especially by having probabilities defined in terms of transition from the possible existing states to other states …   New Collegiate Dictionary

  • Markovian — adjective /maɹˈkoʊviən/ Exhibiting the Markov property, in which the conditional probability distribution of future states of the process, given the present state and all past states, depends only upon the present state and not on any past states … …   Wiktionary
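
Stated as a formula, the Markov property mentioned in the Wiktionary entry above reads as follows for a discrete-time process (X_n); this is a standard textbook formulation added here for illustration, not text quoted from the entry:

```latex
% Markov property for a discrete-time process (X_n):
% the future is conditionally independent of the past given the present.
\[
  P\!\left(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0\right)
  = P\!\left(X_{n+1} = x \mid X_n = x_n\right)
\]
```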

  • Markov process — In probability theory and statistics, a Markov process, named after the Russian mathematician Andrey Markov, is a time-varying random phenomenon for which a specific property (the Markov property) holds. In a common description, a stochastic… …   Wikipedia
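
As a concrete complement to the Wikipedia definition above, the following is a minimal sketch of a discrete-time, finite-state Markov chain simulation; the two-state "weather" chain and its probabilities are invented for illustration and do not come from any of the quoted sources.

```python
import random

# Hypothetical two-state chain: keys are current states,
# values give the probability of moving to each next state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    """Sample the next state given only the current state (Markov property)."""
    next_states = list(TRANSITIONS[state].keys())
    probs = list(TRANSITIONS[state].values())
    return random.choices(next_states, weights=probs, k=1)[0]

def simulate(start: str, n_steps: int) -> list[str]:
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

if __name__ == "__main__":
    print(simulate("sunny", 10))
```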

  • Markov process — Markovo vyksmas (Lithuanian); status: physics term; equivalents: Engl. Markov process, Markovian process; Ger. Markow-Prozeß m, Markowscher Prozeß m; Rus. марковский процесс m, процесс Маркова m; Fr. processus de Markoff m, processus marcovien m; … …   Fizikos terminų žodynas

  • Markov decision process — Markov decision processes (MDPs), named after Andrey Markov, provide a mathematical framework for modeling decision making in situations where outcomes are partly random and partly under the control of a decision maker. MDPs are useful for… …   Wikipedia
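
To make the Markov decision process entry above more concrete, here is a minimal value-iteration sketch over a tiny, invented MDP; the states, actions, rewards and discount factor are all illustrative assumptions, not part of the quoted article.

```python
# Value iteration for a tiny, made-up MDP.
# P[s][a] is a list of (probability, next_state, reward) triples.
P = {
    "s0": {"stay": [(1.0, "s0", 0.0)],
           "go":   [(0.9, "s1", 1.0), (0.1, "s0", 0.0)]},
    "s1": {"stay": [(1.0, "s1", 2.0)],
           "go":   [(1.0, "s0", 0.0)]},
}
GAMMA = 0.9  # discount factor (assumed)

def value_iteration(tol: float = 1e-6) -> dict[str, float]:
    """Iterate the Bellman optimality update until values stop changing."""
    V = {s: 0.0 for s in P}
    while True:
        delta = 0.0
        for s in P:
            best = max(
                sum(p * (r + GAMMA * V[s2]) for p, s2, r in outcomes)
                for outcomes in P[s].values()
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

if __name__ == "__main__":
    print(value_iteration())
```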

  • Poisson process — A Poisson process, named after the French mathematician Siméon Denis Poisson (1781–1840), is the stochastic process in which events occur continuously and independently of one another (the word event used here is not an instance of the… …   Wikipedia
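
The Poisson process described above can be simulated by drawing independent exponential inter-arrival times; the sketch below illustrates that construction, with an arbitrarily chosen rate parameter.

```python
import random

def poisson_arrivals(rate: float, horizon: float) -> list[float]:
    """Event times of a homogeneous Poisson process on [0, horizon].

    Inter-arrival times are independent Exponential(rate) draws,
    so the number of events in [0, horizon] is Poisson(rate * horizon).
    """
    times, t = [], 0.0
    while True:
        t += random.expovariate(rate)  # next exponential waiting time
        if t > horizon:
            return times
        times.append(t)

if __name__ == "__main__":
    print(poisson_arrivals(rate=2.0, horizon=5.0))
```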

  • Partially observable Markov decision process — A Partially Observable Markov Decision Process (POMDP) is a generalization of a Markov Decision Process. A POMDP models an agent decision process in which it is assumed that the system dynamics are determined by an MDP, but the agent cannot… …   Wikipedia
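
Since the POMDP entry above notes that the agent cannot observe the underlying state, the agent instead maintains a belief, i.e. a probability distribution over states, updated by Bayes' rule after each action and observation. A minimal belief-update sketch with invented transition and observation tables (not taken from the quoted article) is:

```python
# Bayesian belief update for a tiny, made-up POMDP.
# T[s][a][s2]: probability of reaching s2 from s under action a.
# O[s2][a][o]: probability of observing o after reaching s2 via action a.
T = {"s0": {"a": {"s0": 0.7, "s1": 0.3}},
     "s1": {"a": {"s0": 0.2, "s1": 0.8}}}
O = {"s0": {"a": {"obs0": 0.9, "obs1": 0.1}},
     "s1": {"a": {"obs0": 0.3, "obs1": 0.7}}}

def update_belief(belief: dict[str, float], action: str, obs: str) -> dict[str, float]:
    """b'(s') is proportional to O(o | s', a) * sum_s T(s' | s, a) * b(s)."""
    new = {}
    for s2 in T:
        predicted = sum(T[s][action].get(s2, 0.0) * belief[s] for s in belief)
        new[s2] = O[s2][action][obs] * predicted
    total = sum(new.values())
    return {s: v / total for s, v in new.items()} if total > 0 else new

if __name__ == "__main__":
    print(update_belief({"s0": 0.5, "s1": 0.5}, action="a", obs="obs1"))
```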

  • Markov additive process — In applied probability, a Markov additive process (MAP) {(X(t),J(t)) : t ≥ 0} is a bivariate Markov process whose transition probability measure is translation invariant in the additive component X(t). That is to say, the… …   Wikipedia
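
The translation invariance mentioned in the Markov additive process entry can be written out explicitly; one standard formulation, added here as an illustration rather than quoted from the article, is:

```latex
% Translation invariance of the additive component X(t):
% increments of X, jointly with the phase J(t), do not depend on the current level x.
\[
  P\!\left(X(t+s) - X(s) \in A,\; J(t+s) = j \,\middle|\, X(s) = x,\; J(s) = i\right)
  \quad\text{does not depend on } x .
\]
```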
